Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
Authors
Dmitriy Drusvyatskiy, Adrian S. Lewis
Abstract
Similar resources
Error bounds, quadratic growth, and linear convergence of proximal methods
Authors are encouraged to submit new papers to INFORMS journals by means of a style file template, which includes the journal title. However, use of a template does not certify that the paper has been accepted for publication in the named journal. INFORMS journal templates are for the exclusive purpose of submitting to an INFORMS journal and should not be used to distribute the papers in print ...
Linear Convergence of Proximal Incremental Aggregated Gradient Methods under Quadratic Growth Condition
Under the strong convexity assumption, several recent works have studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a non-smooth convex function. In this paper, under the quadratic growth condition, a strictly weaker condition than strong convexity, we derive a...
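For context (a standard formulation, not quoted from the abstract above), the quadratic growth condition and a generic PIAG-style update can be written as follows; the objective $F = \sum_i f_i + h$, the step size $\gamma$, the modulus $\mu$, and the delay indices $\tau_i^k$ are notation assumed here for illustration.

% Quadratic growth: F grows at least quadratically with the distance to the
% solution set X* (implied by strong convexity, but strictly weaker).
\[
  F(x) \;\ge\; F^\star \;+\; \frac{\mu}{2}\,\operatorname{dist}^2\bigl(x,\,X^\star\bigr)
  \qquad \text{for all } x,
\]
% One PIAG-style iteration: sum (possibly delayed) component gradients,
% then apply the proximal operator of the non-smooth term h.
\[
  x^{k+1} \;=\; \operatorname{prox}_{\gamma h}\!\Bigl(x^{k} \;-\; \gamma \sum_{i=1}^{n} \nabla f_i\bigl(x^{\tau_i^{k}}\bigr)\Bigr).
\]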
The Effects of Error Correction Methods on Pronunciation Accuracy
The aim of this study was to identify the most effective error correction method for the accuracy of intonation and word stress in English pronunciation. The study was carried out by implementing four methods of providing error correction in four groups, three experimental and one control, consisting of upper-intermediate students using the first Passages book. The first group contained 15 students, the second 14, the third 15, and the last 16. The course ran for 10 weeks and ...
Quadratic Convergence of Vortex Methods
We prove quadratic convergence for two-dimensional vortex methods with positive cutoffs. The result is established for flows whose initial vorticity is three times continuously differentiable and has compact support. The proof is based on a refined version of a convergence result.
Introduction. The purpose of this paper is to prove that vortex methods with positive cutoffs can converge quadratically if...
Tight Global Linear Convergence Rate Bounds for Operator Splitting Methods
In this paper we establish necessary and sufficient conditions for linear convergence of operator splitting methods for a general class of convex optimization problems where the associated fixed-point operator is averaged. We also provide a tight bound on the achievable convergence rate. Most existing results establishing linear convergence in such methods require restrictive assumptions regard...
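As a reminder of the standard definitions involved (assumed here, not taken from the listed abstract), an operator $T$ is $\alpha$-averaged when it is a convex combination of the identity and a nonexpansive map $R$, and the splitting method is the associated fixed-point iteration:

% Averaged operator: convex combination of the identity and a nonexpansive map R.
\[
  T \;=\; (1-\alpha)\,\mathrm{Id} \;+\; \alpha R, \qquad \alpha \in (0,1), \qquad \|Rx - Ry\| \le \|x - y\|,
\]
% Fixed-point iteration; linear convergence means the distance to the fixed-point
% set shrinks geometrically for some rho in (0,1).
\[
  x^{k+1} \;=\; T x^{k}, \qquad \operatorname{dist}\bigl(x^{k},\,\operatorname{Fix} T\bigr) \;\le\; C\,\rho^{k}.
\]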
Journal
Journal title: Mathematics of Operations Research
Year: 2018
ISSN: 0364-765X, 1526-5471
DOI: 10.1287/moor.2017.0889